Topic 2 - ROS and point clouds

Goals of the Class

In this class, you will use ROS2 (Robot Operating System 2) to perform 2D SLAM (Simultaneous Localization and Mapping) and AMCL (Adaptive Monte Carlo Localization) with a simulated Turtlebot3 robot. Moreover, you will map a real 3D environment using data recorded from a 3D LIDAR sensor.

Before You Begin

To complete the exercises, you’ll need two Docker images: arm/lab03 (for 2D SLAM and navigation) and arm/lab06 (for 3D SLAM). These images contain pre-configured ROS2 environments with all required tools and dependencies.

Check and Download Docker Images

Verify if the images are already on your computer:

docker images

Look for arm/lab03 and arm/lab06 in the output.

If they’re missing, download them:

For arm/lab03:

wget --content-disposition --no-check-certificate https://chmura.put.poznan.pl/s/pszNFePmGXxu1XX/download

For arm/lab06:

wget --content-disposition --no-check-certificate https://chmura.put.poznan.pl/s/B1td9ifRL1S0js9/download

Load the downloaded .tar.gz files into Docker:

docker load < path/to/file.tar.gz

ROS2 Introduction / Recap

ROS2 Humble Hawksbill

The Robot Operating System (ROS) is a set of open-source development libraries and tools for building robotic applications. We will use one of the latest stable versions of ROS2, Humble Hawksbill.

Key ROS2 Concepts

Here’s a breakdown of essential terms:

Node: a single process performing computation, e.g. a sensor driver or a SLAM algorithm.
Topic: a named bus over which nodes exchange data.
Message: a typed data structure sent over a topic, e.g. sensor_msgs/msg/LaserScan.
Publisher / Subscriber: a node endpoint that sends / receives messages on a topic.
Launch file: a description that starts a group of nodes with their parameters in one command.

Multiple Publisher and Multiple Subscriber
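The publish/subscribe pattern at the heart of ROS2 can be illustrated without ROS at all. Below is a hypothetical, minimal in-process message bus in Python (not ROS2 code): several publishers and subscribers share one topic, just as ROS2 nodes share /cmd_vel or /scan.

```python
# Minimal publish/subscribe sketch (illustration only, not ROS2 code).
from collections import defaultdict

class Bus:
    def __init__(self):
        self._subs = defaultdict(list)  # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        # Every subscriber on the topic receives every message,
        # regardless of which publisher sent it.
        for cb in self._subs[topic]:
            cb(msg)

bus = Bus()
received = []
bus.subscribe("/chatter", lambda m: received.append(("node_a", m)))  # subscriber 1
bus.subscribe("/chatter", lambda m: received.append(("node_b", m)))  # subscriber 2
bus.publish("/chatter", "hello")   # first publisher
bus.publish("/chatter", "world")   # second publisher
# both subscribers received both messages
```

In ROS2 the same decoupling holds, except nodes discover each other over the network via DDS instead of sharing a process.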

ROS2 Environment (Workspace)

A ROS environment is the place where packages are stored, e.g. for a particular robot. There can be many different environments on one computer (e.g. ros2_robotA_ws, ros2_robotB_ws). A typical workspace looks like this:

ros2_ws
├── build/    # Temporary files for building packages
├── install/  # Compiled packages ready to use
├── log/      # Build process logs
└── src/      # Your source code and packages

Building the Workspace

Use the colcon tool to build the workspace:

cd ros2_ws
colcon build

Activating the Workspace

After building, “source” the environment to access your packages in the terminal:

source install/setup.bash

Note: Run this command in every new terminal session to work with your workspace.

Node Operations

Starting nodes is done via the command:

ros2 run package_name node_name

It is possible to group nodes so they can be run collectively; launch files are used for this. An existing launch file is invoked with:

ros2 launch package_name launch_name

Topic Operations

Viewing the current list of topics is done using the command:

ros2 topic list

Reading messages from the topic:

ros2 topic echo topic_name

A single topic can have multiple publishers as well as subscribers. Information about them, as well as the type of message being exchanged, can be checked with the command:

ros2 topic info topic_name

It is also possible to publish messages on a topic from the terminal:

ros2 topic pub topic_name message_type 'message_data'

Useful Tools

rviz2

RViz, the ROS2 3D visualization tool. It displays sensor data, maps, transforms, and the robot model, and you will use it throughout this class.

gazebo

Gazebo, a robot simulator. It provides a physics-based 3D environment, so robots can be tested without access to the hardware.

ROS2 Bag: Recording and Playback

ROS2 Bag lets you record and replay topic data, which is useful for testing without a live robot.
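Conceptually, a bag stores timestamped messages and replays them with the original timing, optionally scaled by a rate factor. A rough Python sketch of that timing logic (an illustration, not the actual rosbag2 format):

```python
# Sketch of bag record/playback timing (illustration, not the rosbag2 format).
class Bag:
    def __init__(self):
        self.entries = []  # (timestamp, topic, message)

    def record(self, t, topic, msg):
        self.entries.append((t, topic, msg))

    def play(self, rate=1.0):
        # Yield (delay_before_sending, topic, msg); rate=0.5 doubles the
        # delays, mirroring `ros2 bag play -r 0.5`.
        prev_t = self.entries[0][0] if self.entries else 0.0
        for t, topic, msg in self.entries:
            yield (t - prev_t) / rate, topic, msg
            prev_t = t

bag = Bag()
bag.record(0.0, "/scan", "scan_0")
bag.record(0.1, "/scan", "scan_1")
delays = [d for d, _, _ in bag.play(rate=0.5)]
# at half speed, the gap between the two messages grows from 0.1 s to 0.2 s
```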

Multi-Computer Setup

Nodes in the same domain can freely discover and exchange messages with each other; by default, all ROS2 nodes use domain ID 0. Within the laboratory, each computer must be given a separate, unique domain ID to avoid interference. Read the number from the sticker on the monitor and substitute it for NR_COMPUTER in the command below. If there is no sticker on your computer, pick a number in the range 0-101 or 215-232.

grep -q ^'export ROS_DOMAIN_ID=' ~/.bashrc || echo 'export ROS_DOMAIN_ID=NR_COMPUTER' >> ~/.bashrc
source ~/.bashrc

Part 1: 2D Point Cloud SLAM and Navigation in a Simple Environment

In this part, you’ll use a simulated Turtlebot3 to build a 2D map with SLAM and localize it within a known map with AMCL.

Key Concepts

Advantages of cleaning robots using SLAM. source
Arrows are particles calculated by AMCL. source
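The particle cloud shown by AMCL can be understood with a toy example. Below is a hypothetical 1D Monte Carlo localization sketch (not the AMCL implementation): particles are moved by odometry, weighted by how well a simulated range measurement matches the actual one, and resampled, so the cloud gradually contracts around the true pose.

```python
# 1D Monte Carlo localization sketch (illustration of the AMCL idea).
import random
random.seed(0)

WALL = 10.0          # a wall at x = 10; the sensor measures distance to it
true_x = 2.0
particles = [random.uniform(0.0, 10.0) for _ in range(200)]

def measure(x):
    return WALL - x  # ideal range reading from position x

for _ in range(20):
    true_x += 0.2                                  # the robot drives forward
    particles = [p + 0.2 + random.gauss(0, 0.05)   # motion update with noise
                 for p in particles]
    z = measure(true_x)                            # actual sensor reading
    weights = [1.0 / (1e-6 + abs(measure(p) - z))  # better match -> larger weight
               for p in particles]
    # resample: particles with higher weights survive more often
    particles = random.choices(particles, weights=weights, k=len(particles))

estimate = sum(particles) / len(particles)
# the particle cloud has converged near the true position
```

Moving the robot around (as suggested below with the teleop node) gives AMCL more measurements to weight against, which is exactly why the arrow cloud in RViz shrinks.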

Environment preparation

  1. Load the Docker Image: Use arm/lab03 (see Before You Begin).
  2. Create a Container:
wget <script_url>
bash run_*.sh

The container is named ARM_03 by default.

  3. Build the Workspace:
cd /arm_ws
source /opt/ros/humble/setup.bash
colcon build --symlink-install
source install/setup.bash

NOTE: You can attach a new terminal to the container using the following command: docker exec -it ARM_03 bash

IMPORTANT:

  1. Make sure to source the built environment and set the robot model in every terminal inside the container: source install/setup.bash; export TURTLEBOT3_MODEL=burger

  2. Set the environment variable ROS_DOMAIN_ID in a container as instructed here.

Building the World Map

  1. Launch Gazebo Simulation:
export TURTLEBOT3_MODEL=burger
ros2 launch turtlebot3_gazebo turtlebot3_world.launch.py
  2. Run Cartographer SLAM (new terminal):
export TURTLEBOT3_MODEL=burger
source install/setup.bash
ros2 launch turtlebot3_cartographer cartographer.launch.py use_sim_time:=True

RViz will show the map-building process.

  3. Move the Robot with the teleop node for keyboard operation (new terminal):
source install/setup.bash
ros2 run turtlebot3_teleop teleop_keyboard

Then, drive the robot with the w, a, s, d and x keys until the entire “world” map has been built.

“turtlebot3_world” with a map built. source
  4. Save the Map (new terminal, while the SLAM and simulation are running):
source install/setup.bash
mkdir -p /arm_ws/maps
ros2 run nav2_map_server map_saver_cli -f /arm_ws/maps/turtlebot3_world_map

This creates a .yaml and .pgm file representing the map.
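The .yaml file describes how the .pgm image maps onto the world; its key fields are resolution (meters per pixel) and origin (world coordinates of the lower-left pixel). The sketch below shows the pixel-to-world conversion with example values; the actual numbers in your saved turtlebot3_world_map.yaml will differ.

```python
# Convert a map pixel to world coordinates (sketch; the values below are
# examples, check your own turtlebot3_world_map.yaml for the real ones).
resolution = 0.05              # meters per pixel (a common default)
origin = (-10.0, -10.0)        # world (x, y) of the lower-left map pixel

def pixel_to_world(px, py):
    return (origin[0] + px * resolution,
            origin[1] + py * resolution)

x, y = pixel_to_world(200, 200)
# pixel (200, 200) is 10 m from the origin on each axis -> world (0.0, 0.0)
```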

  5. Turn off the Simulation with Ctrl+C in all terminals.
  6. Launch Gazebo Simulation again:
export TURTLEBOT3_MODEL=burger
ros2 launch turtlebot3_gazebo turtlebot3_world.launch.py
  7. Launch Navigation2 Node with the AMCL method (new terminal):
export TURTLEBOT3_MODEL=burger
ros2 launch turtlebot3_navigation2 navigation2.launch.py use_sim_time:=True map:=/arm_ws/maps/turtlebot3_world_map.yaml
  8. Set Initial Pose in RViz: use the 2D Pose Estimate button, then click and drag on the map at the robot's approximate position, pointing the arrow in its heading direction.
source

Before the next step, you can also drive the robot back and forth a bit with the previously used teleop node; the additional sensor readings let AMCL narrow down the robot's estimated location on the map, displayed as tiny green arrows.

  9. Set Navigation Goal: use the Navigation2 Goal button and click the target pose on the map; the robot will plan a path and drive there autonomously.
source

Play with Parameters

In the turtlebot3_navigation2 package, there is a config file /arm_ws/src/turtlebot3/turtlebot3_navigation2/param/burger.yaml. You can verify how modifying the following parameters affects the operation of the AMCL module:

  1. laser_max_range and laser_min_range
  2. max_beams
  3. max_particles
  4. resample_interval
  5. update_min_a and update_min_d

Install your favorite editor if needed, e.g.:

apt update
apt install vim

Part 2: 3D Point Cloud SLAM

Now, you’ll use lidarslam to build a 3D map from LIDAR data and analyze its performance.

lidarslam: A ROS2 package for 3D SLAM using point clouds. It uses a scan matching method (Normal Distributions Transform (NDT) by default) to calculate the relative transformation between consecutive LIDAR scans and obtain an initial estimate of the motion. It then refines the initial pose estimates and ensures long-term consistency of the map through graph-based pose optimization, which includes a loop closure mechanism.
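To make the scan matching step concrete, the sketch below recovers a known rigid transform between two 2D "scans" with known point correspondences (the closed-form Kabsch/Procrustes solution). This is a simplified stand-in, not NDT: real scan matchers must also establish correspondences and work in 3D.

```python
# Estimate the rigid transform between two 2D scans with known
# correspondences (a simplified stand-in for NDT scan matching).
import math

def align(src, dst):
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    # cross-covariance terms of the centered point sets
    sxx = sxy = syx = syy = 0.0
    for (x1, y1), (x2, y2) in zip(src, dst):
        ax, ay = x1 - csx, y1 - csy
        bx, by = x2 - cdx, y2 - cdy
        sxx += ax * bx; sxy += ax * by
        syx += ay * bx; syy += ay * by
    theta = math.atan2(sxy - syx, sxx + syy)   # optimal rotation angle
    tx = cdx - (math.cos(theta) * csx - math.sin(theta) * csy)
    ty = cdy - (math.sin(theta) * csx + math.cos(theta) * csy)
    return theta, tx, ty

# A scan, and the same scan seen after the robot rotated 30 deg and moved.
scan0 = [(1.0, 0.0), (0.0, 1.0), (2.0, 1.0), (1.0, 3.0)]
th = math.radians(30)
scan1 = [(math.cos(th)*x - math.sin(th)*y + 0.5,
          math.sin(th)*x + math.cos(th)*y - 0.2) for x, y in scan0]
theta, tx, ty = align(scan0, scan1)
# recovered motion: theta ~ 30 deg, translation ~ (0.5, -0.2)
```

Chaining such scan-to-scan transforms yields the odometry-like trajectory that the pose graph later refines.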

Loop closure is a technique in SLAM where the system recognizes when the robot has returned to a previously visited location. When a loop closure is detected, the system can correct accumulated drift errors by adjusting the entire trajectory and map. This results in a more accurate and consistent map, especially for long trajectories where odometry errors would otherwise accumulate.
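A toy version of the drift correction can be sketched in a few lines: when the end of a trajectory is recognized as the start, the accumulated error is distributed along the path. Real pose-graph optimization solves a nonlinear least-squares problem instead; this linear spreading only illustrates why the optimized path differs from the raw one.

```python
# Sketch: distribute loop-closure error along a drifting trajectory.
# Real pose-graph optimization is more involved; this shows the idea only.

# Odometry says the robot walked a square, but drift left the final pose
# away from the start, where loop closure says it actually is.
path = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.1, 0.4)]
closure_target = path[0]          # loop closure constraint: end == start

ex = closure_target[0] - path[-1][0]   # accumulated error in x
ey = closure_target[1] - path[-1][1]   # accumulated error in y
n = len(path) - 1
corrected = [(x + ex * i / n, y + ey * i / n)   # later poses get more correction
             for i, (x, y) in enumerate(path)]
# the corrected end pose now coincides with the start pose
```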

Map created by lidarslam

Environment Preparation

  1. Load the Docker Image: Use arm/lab06 (see Before You Begin).
  2. Create a Container:
wget <script_url>
bash run_*.sh

The container is named ARM_06 by default.

  3. Build the Workspace:
cd /arm_ws
source /opt/ros/humble/setup.bash
colcon build --symlink-install
source install/setup.bash

NOTE: You can attach a new terminal to the container using the following command: docker exec -it ARM_06 bash

Running lidarslam

cd /arm_ws
source install/setup.bash
ros2 launch lidarslam lidarslam.launch.py

An RViz window should appear, in which the localization and map-building process will be visualized.

HDL_400

Play back data recorded with a Velodyne VLP-32 LIDAR sensor.

  1. Play the bag file:
ros2 bag play -p -r 0.5 bags/hdl_400

The replay will start paused and run at half the normal speed.

  2. Add a PointCloud2 data type to the visualization from the /velodyne_points topic in RViz. It contains the “current” readings from the LIDAR.

  3. Unpause the bag replay by pressing the space key in the appropriate terminal.

  4. Observe the difference between the maps on the /map topic (raw map) and the /modified_map topic (optimized map). Similarly, observe the difference between the /path (yellow) and /modified_path (green) topics. Unfortunately, there is no ground truth localization for this data, but you can see the map being optimized by the loop closure mechanism.

KITTI 00

A bag file with the first 200 scans from sequence 00 of the KITTI dataset was prepared. The data also contains ground truth localization, which can be used to assess the system's performance.

  1. Restart lidarslam:
ros2 launch lidarslam lidarslam.launch.py
  2. Play the bag file:
ros2 bag play -p bags/kitti
  3. Add a Path data type to the visualization from the /path_gt_lidar topic in RViz. Additionally, change its color to distinguish it from the other paths (yellow and green).

  4. Unpause the bag replay by pressing the space key in the appropriate terminal.

  5. Observe the difference between the ground truth line and the path returned by SLAM.

You will likely observe that the SLAM algorithm processes data too slowly, resulting in a jagged trajectory that is significantly shorter than the ground truth. To verify this, repeat the experiment while playing the rosbag at a reduced speed (e.g., using the -r 0.3 argument in the ros2 bag play command).

Play with the parameters

Analyze the lidarslam documentation and observe the system in operation to verify the impact of the parameters located in /arm_ws/src/lidarslam_ros2/lidarslam/param/lidarslam.yaml. Adjust these parameters to ensure the SLAM system operates somewhat accurately in real-time:

  1. ndt_resolution
  2. trans_for_mapupdate
  3. voxel_leaf_size
  4. loop_detection_period
  5. threshold_loop_closure_score
  6. distance_loop_closure
  7. range_of_searching_loop_closure
  8. search_submap_num

Sources and useful references